Expressive Avatar Interactions for Safe Social and Interactive Online Dancing Experiences
Stage 11 - VR Track - Confex Level 2
Tags: Mental Health, Research & Education, Tech & Coding, VR / XR
Information
Background:
The proposed workshop is organized by the contributors to the Carousel Research project supported by the European Union.
The project combines AI and VR to combat loneliness by enabling individuals who are isolated in their respective living spaces to interact socially and enjoy dancing together in the Metaverse.
Synopsis:
This workshop, “Expressive Avatar Interactions for Social and Interactive Online Dancing Experiences”, brings together experts from the Carousel project and beyond to explore the scientific, technical, and societal challenges and opportunities of real-time interaction in the Metaverse between individuals represented by expressive dancing avatars.
The workshop will address both the technical and the social aspects of dancing in VR, covering the two main scenarios investigated by the project:
- real-time interactions between individuals dancing from different locations,
- real-time interaction with AI controlled virtual avatars.
Alongside presentations by renowned experts in their respective scientific, artistic, and technical domains, the workshop will feature hands-on demonstrations by professional dancers – and possibly volunteers from the audience – of low-latency synchronized dancing between human- and AI-animated avatars in gentle virtual worlds specially designed for a relaxing and inspiring dance experience.
The organizing team:
The workshop will be co-organized by Carousel project participants. The presenters combine experience in deploying character animation for video games (including Harry Potter and Destiny) and films (including Cars, Ratatouille, Finding Dory, and Star Wars) with technical expertise in animation, latency reduction, motion capture, expressive avatars, AI agents, simulated reality, and the artistic design of virtual worlds.
Target Audiences
Researchers/Developers/Designers/Artists
Experience Level
Intermediate
Key Takeaways
The ability and desire to dance are deeply rooted in human nature. People from all cultures socialize by dancing together to celebrate or simply to have fun moving their bodies.
Yet many obstacles can prevent individuals from dancing together: physical or mental health issues, lack of time, or geographical constraints. By providing the ability to dance online, some of these problems disappear or become less acute.
For a realistic and enjoyable dance experience, a great number of technical and scientific hurdles have had to be addressed. Among them: capture of body poses and facial expressions using affordable hardware, latency reduction, expressive avatars, virtual dancers driven by generative AI, and community-driven design of a gentle, friendly dance world.
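To give a flavor of one of these hurdles, a common generic technique for masking network latency between remote dancers is dead-reckoning: each client briefly extrapolates the last received pose of a remote avatar using its last known velocity. This is only an illustrative sketch of the general idea, not a description of the project's actual implementation; the function name and parameters are hypothetical.

```python
def extrapolate_pose(last_pose, last_velocity, age_s, max_age_s=0.15):
    """Dead-reckon a remote dancer's pose to mask network latency.

    last_pose / last_velocity: per-joint values (e.g. joint angles)
    age_s: seconds elapsed since the last pose packet arrived
    Extrapolation is clamped at max_age_s so a stalled stream
    does not drift indefinitely.
    """
    dt = min(age_s, max_age_s)
    return [p + v * dt for p, v in zip(last_pose, last_velocity)]

# One joint angle moving at 2.0 units/s, last packet 50 ms old:
predicted = extrapolate_pose([1.0], [2.0], 0.05)  # -> [1.1]
```

In practice, real systems blend such predictions with newly arrived data to avoid visible snapping, but the clamped extrapolation above captures the core trade-off between responsiveness and drift.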
All the innovations presented are described in open-access scientific papers, and the corresponding open-source code is available on GitHub.
Following a short general introduction about the Carousel project by Pieter van der Linden, the workshop will feature the following presentations:
- Prof. Kenny Mitchell will present DanceGraph, the low-latency social interactive dance engine. He will address its architecture, telemetry (DanceMark), expressive body dance interactions (Moodflow), generative AI for creating dance worlds (HoloJig), and emotional talking avatars.
- Carmen Mac Williams will present the lessons learned designing and testing a “Plantworld” for social dancing to combat loneliness and prevent mental health problems.
- Adas Slezas will present their experience using Motion Matching to animate an interactive VR dance partner. The talk will focus on the conceptual aspects of how to use and think about Motion Matching and similar systems.
- Noshaba Cheema will present the use of neural networks for creating realistic autonomous dancing avatars.
Hands-on demonstrations will illustrate the talks. Our demos will also be featured at our booth at Gamescom.
Session Type
Talk